Respondent Mode Choice in a Smartphone Survey, United States, 2012 (ICPSR 37836)
Version Date: Oct 8, 2020
Principal Investigator(s):
Frederick G. Conrad, University of Michigan. Institute for Social Research. Survey Research Center;
Michael F. Schober, The New School for Social Research (New York, N.Y.: 2005- ). Department of Psychology
https://doi.org/10.3886/ICPSR37836.v1
Version V1
Summary
Now that people on mobile devices can easily choose their mode of communication (e.g., voice, text, video), survey designers can allow respondents to answer questions in whatever mode is momentarily convenient given their circumstances, or in the mode they chronically prefer. The investigators conducted an experiment to explore how mode choice affects response quality, participation, and satisfaction in smartphone interviews.

Respondents were interviewed on their iPhones in one of four modes: Human Voice, Human Text, Automated Voice, and Automated Text. Respondents were either assigned the mode of their interview (Assigned Mode), in which case the contact and interviewing modes were the same, or were required to choose the mode of their interview (Mode Choice) after being contacted in one of the four modes. In the Assigned Mode group, 634 respondents completed the interview and a post-interview online debriefing questionnaire; in the Mode Choice group, 626 respondents completed the interview and online debriefing. This dataset contains 2,691 cases: the 1,260 respondents who completed the interview and debriefing, as well as 1,431 cases of invitees who ended their participation before the last debriefing question (they did not choose a mode, did not answer the first question, started but did not finish the interview, or finished the interview but did not complete the debriefing). All respondents who completed the interview answered 32 questions drawn from US social surveys. Thirteen interviewers from the University of Michigan Survey Research Center administered voice and text interviews (five administered interviews in both experimental conditions, three conducted only Assigned Mode interviews, and five conducted only Mode Choice interviews). Automated systems launched parallel text and voice interviews at the same time as the human interviews.

Respondents who chose their interview mode provided more conscientious (fewer rounded and non-differentiated) answers, and they reported greater satisfaction with the interview. Although fewer respondents started the interview when given a choice of mode, a higher percentage of Mode Choice respondents who started the interview completed it. For certain mode transitions (e.g., from automated interview modes), there was no reduction in participation. The results demonstrate clear benefits and relatively few drawbacks of mode choice, at least among these modes and with this sample of iPhone users, suggesting that further exploration of mode choice and the logistics of its implementation is warranted. Demographic variables include participants' gender, race, education level, and household income.
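The "rounded" and "non-differentiated" answers mentioned above are standard response-quality indicators: rounding means reporting approximate values (e.g., multiples of 5 or 10) to numeric questions, and non-differentiation ("straightlining") means giving identical ratings across a battery of items. The following is a minimal illustrative sketch in Python; the function names and the multiple-of-5 threshold are hypothetical assumptions, not the investigators' actual coding scheme.

def is_rounded(value, base=5):
    # Flag a numeric answer as possibly rounded if it is a multiple of `base`
    # (a base of 5 is an illustrative assumption, not the study's rule).
    return value % base == 0

def is_nondifferentiated(ratings):
    # Flag a battery of ratings as non-differentiated ("straightlined")
    # if every rating in the battery is identical.
    return len(set(ratings)) == 1

print(is_rounded(40))                   # True: 40 is a multiple of 5
print(is_nondifferentiated([3, 3, 3]))  # True: identical ratings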
Citation
Funding
Subject Terms
Geographic Coverage
Smallest Geographic Unit
Country
Distributor(s)
Time Period(s)
Date of Collection
Data Collection Notes
- This study was originally released in OpenICPSR.
- This study is related to ICPSR 37837 and ICPSR 37846.
Study Purpose
Now that people on mobile devices can easily choose their mode of communication (e.g., voice, text, video), survey designers can allow respondents to answer questions in whatever mode is momentarily convenient given their circumstances, or in the mode they chronically prefer. The investigators conducted an experiment to explore how mode choice affects response quality, participation, and satisfaction in smartphone interviews.
Study Design
Respondents were interviewed on their iPhones in one of four modes: Human Voice, Human Text, Automated Voice, and Automated Text. Respondents were either assigned the mode of their interview (Assigned Mode), in which case the contact and interviewing modes were the same, or were required to choose the mode of their interview (Mode Choice) after being contacted in one of the four modes. In the Assigned Mode group, 634 respondents completed the interview and a post-interview online debriefing questionnaire; in the Mode Choice group, 626 respondents completed the interview and online debriefing.
This dataset contains 2,691 cases: the 1,260 respondents who completed the interview and debriefing, as well as 1,431 cases of invitees who ended their participation before the last debriefing question (they did not choose a mode, did not answer the first question, started but did not finish the interview, or finished the interview but did not complete the debriefing). All respondents who completed the interview answered 32 questions drawn from US social surveys. Thirteen interviewers from the University of Michigan Survey Research Center administered voice and text interviews (five administered interviews in both experimental conditions, three conducted only Assigned Mode interviews, and five conducted only Mode Choice interviews). Automated systems launched parallel text and voice interviews at the same time as the human interviews.
Sample
iPhone users were recruited from Craigslist, Facebook, Google Ads, and Amazon Mechanical Turk. They were asked to complete a screening questionnaire to determine whether they were eligible to participate. To be eligible, a participant needed to be 21 or older and to own an iPhone with a US-area-code phone number. The recruited participants were not intended to represent the US population, iPhone users, or smartphone users; rather, the sample was designed to test experimental manipulations through random assignment to conditions on a consistent platform. Eligible participants who provided a telephone number in the screening questionnaire were sent a text message with a link to a web page, which captured the user-agent string to determine whether the device was an iPhone. Eligible phone numbers were then assigned a contact mode. Respondents in the Mode Choice group could choose to be interviewed in the contact mode or in one of the other three modes. Once the interview had been completed, respondents were sent a link via text message to a post-interview debriefing questionnaire concerning their experience. At the conclusion of the post-interview debriefing, respondents were sent a text message with a $20 iTunes gift code as a token of appreciation for their time.
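The screening and assignment mechanics described above can be sketched minimally in Python. The function names, the substring-based user-agent check, and the equal-probability assignment are illustrative assumptions, not the study's actual implementation:

import random

MODES = ["Human Voice", "Human Text", "Automated Voice", "Automated Text"]

def is_iphone(user_agent):
    # Crude check that the browser's user-agent string reports an iPhone
    # (hypothetical; the study's web page may have parsed more strictly).
    return "iPhone" in user_agent

def assign_contact_mode(rng=random):
    # Assign one of the four contact modes at random; equal probabilities
    # are an assumption here, as the description does not specify weights.
    return rng.choice(MODES)

ua = "Mozilla/5.0 (iPhone; CPU iPhone OS 5_1 like Mac OS X) AppleWebKit/534.46"
if is_iphone(ua):
    print(assign_contact_mode())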
Time Method
Universe
iPhone users 21 years of age and older
Unit(s) of Observation
Data Type(s)
Mode of Data Collection
Response Rates
The response rates below were calculated using American Association for Public Opinion Research Response Rate 2 (AAPOR RR2).
46.4% (654/1,409) - Mode Choice
50.5% (648/1,282) - Assigned Mode
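AAPOR RR2 counts complete plus partial interviews in the numerator, divided by all eligible and unknown-eligibility cases; the entry above reports only the combined counts, so the short Python check below simply reproduces the published percentages from those counts:

completed_mode_choice, eligible_mode_choice = 654, 1409
completed_assigned, eligible_assigned = 648, 1282

# Reproduce the reported rates from the counts above.
print(f"Mode Choice:   {completed_mode_choice / eligible_mode_choice:.1%}")  # 46.4%
print(f"Assigned Mode: {completed_assigned / eligible_assigned:.1%}")        # 50.5%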
Original Release Date
2020-10-08
Version History
2020-10-08 ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:
- Created variable labels and/or value labels.
- Created online analysis version with question text.
- Checked for undocumented or out-of-range codes.
Notes
These data are freely available to data users at ICPSR member institutions. The curation and dissemination of this study are provided by the institutional members of ICPSR.